Search results for "Sliced inverse regression"
Showing 6 of 6 documents
A semiparametric approach to estimate reference curves for biophysical properties of the skin
2006
Reference curves which take one covariate, such as age, into account are often required in medicine, but simple, systematic and efficient statistical methods for constructing them are lacking. Classical methods are based on parametric fitting (polynomial curves). In this chapter, we describe a new methodology for the estimation of reference curves for data sets, based on nonparametric estimation of conditional quantiles. The derived method should be applicable to all clinical or, more generally, biological variables that are measured on a continuous quantitative scale. To avoid the curse of dimensionality when the covariate is multidimensional, a new semiparametric approach is proposed. Th…
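The abstract does not spell out the estimator, so the following is only a generic illustration of the kernel-weighted conditional quantile idea that underlies such nonparametric reference curves; the function name, Gaussian kernel, bandwidth and quantile level are assumptions for the sketch, not the authors' method.

```python
import numpy as np

def kernel_quantile_curve(t_obs, y_obs, t_grid, tau=0.95, bandwidth=5.0):
    """Generic sketch: at each grid value of the covariate (e.g. age),
    estimate the tau-th conditional quantile of y from a kernel-weighted
    empirical distribution of the observations."""
    order = np.argsort(y_obs)
    y_sorted = y_obs[order]
    curve = []
    for t0 in t_grid:
        w = np.exp(-0.5 * ((t_obs - t0) / bandwidth) ** 2)  # Gaussian weights around t0
        cw = np.cumsum(w[order]) / w.sum()                   # weighted CDF of y near t0
        idx = min(np.searchsorted(cw, tau), len(cw) - 1)
        curve.append(y_sorted[idx])
    return np.array(curve)
```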
Some extensions of multivariate sliced inverse regression
2007
Multivariate sliced inverse regression (SIR) is a method for achieving dimension reduction in regression problems when the outcome variable y and the regressor x are both assumed to be multidimensional. In this paper, we extend the existing approaches, based on the usual SIR-I which uses only the inverse regression curve, to methods using properties of the inverse conditional variance. Contrary to the existing ones, these new methods are not blind to symmetric dependencies and rely on SIR-II or SIRα. We also propose their corresponding pooled slicing versions. We illustrate the usefulness of these approaches in simulation studies.
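For context, here is a minimal sketch of the baseline SIR-I estimator with a univariate response (standardize x, slice on y, average within slices, eigen-decompose the between-slice matrix); the multivariate and SIR-II/SIRα extensions discussed in the paper build on this but are not reproduced here.

```python
import numpy as np

def sir_directions(x, y, n_slices=10, n_dirs=2):
    """SIR-I sketch: slice on y, average the standardized regressor
    within slices, and eigen-decompose the between-slice covariance
    of the inverse regression curve E[x | y]."""
    n, p = x.shape
    mean_x = x.mean(axis=0)
    cov_x = np.cov(x, rowvar=False)
    chol = np.linalg.cholesky(np.linalg.inv(cov_x))
    z = (x - mean_x) @ chol                       # standardized regressor
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)      # slice on the response
    m = np.zeros((p, p))
    for idx in slices:
        zh = z[idx].mean(axis=0)                  # within-slice mean
        m += len(idx) / n * np.outer(zh, zh)
    vals, vecs = np.linalg.eigh(m)                # ascending eigenvalues
    dirs_z = vecs[:, ::-1][:, :n_dirs]            # leading directions (z scale)
    return chol @ dirs_z                          # back to the original x scale
```

The leading eigenvectors, mapped back to the original scale, estimate the e.d.r. directions.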
A Statistical Matrix Representation Using Sliced Orthogonal Nonlinear Correlations for Pattern Recognition
2000
In pattern recognition, the choice of features to be detected is a critical factor in determining the success or failure of a method; much research has gone into finding the best features for particular tasks [1]. When images are detected by digital cameras, they are usually acquired as rectangular arrays of pixels, so the initial features are pixel values. Some methods use those pixel values directly for processing, for instance in normal matched filtering [2], whereas other methods perform some degree of pre-processing, such as binarizing the pixel values [3].
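For reference, "normal matched filtering" on raw pixel values can be sketched as a cross-correlation with a template followed by peak picking; the zero-mean template, the SciPy correlate2d call and the peak rule below are illustrative choices, not the method developed in this chapter.

```python
import numpy as np
from scipy.signal import correlate2d

def matched_filter_peak(image, template):
    """Correlate the image with a zero-mean template and return the
    location and value of the strongest response."""
    t = template - template.mean()                 # zero-mean template
    response = correlate2d(image, t, mode="same")  # correlation over the image
    peak = np.unravel_index(np.argmax(response), response.shape)
    return peak, response[peak]
```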
Asymptotic and bootstrap tests for subspace dimension
2022
Most linear dimension reduction methods proposed in the literature can be formulated using an appropriate pair of scatter matrices, see e.g. Ye and Weiss (2003), Tyler et al. (2009), Bura and Yang (2011), Liski et al. (2014) and Luo and Li (2016). The eigen-decomposition of one scatter matrix with respect to another is then often used to determine the dimension of the signal subspace and to separate signal and noise parts of the data. Three popular dimension reduction methods, namely principal component analysis (PCA), fourth order blind identification (FOBI) and sliced inverse regression (SIR) are considered in detail and the first two moments of subsets of the eigenvalues are used to test…
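As an illustration of the two-scatter construction described here, the sketch below pairs the covariance matrix with a FOBI-style fourth-moment scatter and returns the generalized eigenvalues on which such dimension tests operate; the specific scatter pair and scaling are assumptions, and the asymptotic and bootstrap tests themselves are not implemented.

```python
import numpy as np
from scipy.linalg import eigh

def scatter_pair_eigenvalues(x):
    """Eigen-decompose a fourth-moment (FOBI-style) scatter with respect
    to the covariance matrix; subsets of the resulting eigenvalues are
    what the dimension tests examine."""
    n, p = x.shape
    xc = x - x.mean(axis=0)
    s1 = np.cov(x, rowvar=False)                       # covariance
    r2 = np.sum(xc @ np.linalg.inv(s1) * xc, axis=1)   # squared Mahalanobis norms
    s2 = (xc * r2[:, None]).T @ xc / n                 # fourth-moment scatter
    vals, vecs = eigh(s2, s1)                          # generalized eigendecomposition
    return vals[::-1], vecs[:, ::-1]                   # decreasing order
```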
Asymptotics for pooled marginal slicing estimator based on SIRα approach
2005
Pooled marginal slicing (PMS) is a semiparametric method, based on the sliced inverse regression (SIR) approach, for achieving dimension reduction in regression problems when the outcome variable y and the regressor x are both assumed to be multidimensional. In this paper, we consider the SIRα version (combining the SIR-I and SIR-II approaches) of the PMS estimator and we establish the asymptotic distribution of the estimated matrix of interest. Then the asymptotic normality of the eigenprojector onto the estimated effective dimension reduction (e.d.r.) space is derived, as well as the asymptotic distributions of each estimated e.d.r. direction and its corresponding eigenvalue.
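A hedged sketch of a SIRα-type matrix of interest, assuming the convex combination (1 − α)·M_I + α·M_II on already standardized data and one common definition of the SIR-II matrix; the exact weighting, slicing scheme and pooling over the components of y used in the paper may differ.

```python
import numpy as np

def sir_alpha_matrix(z, y, alpha=0.5, n_slices=10):
    """Combine a SIR-I matrix (slice means) and a SIR-II matrix
    (slice covariances) for standardized data z and a univariate y."""
    n, p = z.shape
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    props = [len(idx) / n for idx in slices]
    means = [z[idx].mean(axis=0) for idx in slices]
    covs = [np.cov(z[idx], rowvar=False) for idx in slices]
    m1 = sum(p_h * np.outer(m_h, m_h) for p_h, m_h in zip(props, means))
    v_bar = sum(p_h * v_h for p_h, v_h in zip(props, covs))
    m2 = sum(p_h * (v_h - v_bar) @ (v_h - v_bar) for p_h, v_h in zip(props, covs))
    return (1 - alpha) * m1 + alpha * m2
```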
On the usage of joint diagonalization in multivariate statistics
2022
Scatter matrices generalize the covariance matrix and are useful in many multivariate data analysis methods, including the well-known principal component analysis (PCA), which is based on the diagonalization of the covariance matrix. The simultaneous diagonalization of two or more scatter matrices goes beyond PCA and is used increasingly often. In this paper, we offer an overview of many methods that are based on joint diagonalization. These methods range from the unsupervised context with invariant coordinate selection and blind source separation, which includes independent component analysis, to the supervised context with discriminant analysis and sliced inverse regression. They also enco…
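A minimal sketch of joint diagonalization of two scatter matrices (the transform behind invariant coordinate selection and related methods): find B with B S1 B^T = I and B S2 B^T diagonal via a generalized eigendecomposition; the choice of the second scatter is left to the caller and is an assumption of this sketch.

```python
import numpy as np
from scipy.linalg import eigh

def two_scatter_coordinates(x, scatter2):
    """Jointly diagonalize the covariance S1 and a second scatter S2:
    the generalized eigenvectors V satisfy V.T @ S1 @ V = I and
    V.T @ S2 @ V = diag(eigenvalues).  Returns the transformed
    (invariant) coordinates and the eigenvalues."""
    s1 = np.cov(x, rowvar=False)
    s2 = scatter2(x)                    # any second scatter functional
    vals, vecs = eigh(s2, s1)           # generalized eigendecomposition
    order = np.argsort(vals)[::-1]      # sort by decreasing eigenvalue
    b = vecs[:, order].T
    return (x - x.mean(axis=0)) @ b.T, vals[order]
```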